dental medicine (Noun) — The branch of medicine concerned with the anatomy, development, and diseases of the teeth.